
    A circuit mechanism for independent modulation of excitatory and inhibitory firing rates after sensory deprivation

    Diverse interneuron subtypes shape sensory processing in mature cortical circuits. During development, sensory deprivation evokes powerful synaptic plasticity that alters circuitry, but how different inhibitory subtypes modulate circuit dynamics in response to this plasticity remains unclear. We investigate how deprivation-induced synaptic changes affect excitatory and inhibitory firing rates in a microcircuit model of the sensory cortex with multiple interneuron subtypes. We find that with a single interneuron subtype (parvalbumin-expressing [PV]), excitatory and inhibitory firing rates can only be comodulated, that is, increased or decreased together. Explaining the experimentally observed independent modulation, whereby one firing rate increases while the other decreases, requires strong feedback from a second interneuron subtype (somatostatin-expressing [SST]). Our model applies to both the visual and the somatosensory cortex, suggesting a general mechanism across sensory cortices. We therefore provide a mechanistic explanation for the differential role of interneuron subtypes in regulating firing rates, contributing to the already diverse roles they serve in the cortex.
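
    As a minimal sketch of the kind of microcircuit model described above, the rate equations below implement a three-population (E, PV, SST) circuit. The weight matrix, external drives, and time constants are illustrative placeholders rather than values from the study; a deprivation-like manipulation would correspond to changing entries of W.

    import numpy as np

    # Hypothetical 3-population rate model (E, PV, SST); all parameters are
    # illustrative placeholders, not values from the study.
    W = np.array([[ 1.0, -1.5, -1.0],   # onto E:   from E, PV, SST
                  [ 1.2, -1.0, -0.5],   # onto PV:  from E, PV, SST
                  [ 1.0,  0.0,  0.0]])  # onto SST: strong feedback from E
    ext = np.array([1.0, 0.8, 0.5])     # external drive to E, PV, SST
    tau = np.array([10.0, 5.0, 10.0])   # time constants (ms)

    def steady_state_rates(W, ext, T=500.0, dt=0.1):
        """Integrate dr/dt = (-r + [W r + ext]_+) / tau and return the final rates."""
        r = np.zeros(3)
        for _ in range(int(T / dt)):
            r = r + dt * (-r + np.maximum(W @ r + ext, 0.0)) / tau
        return r

    # Comparing steady states before and after a deprivation-like change in W shows
    # whether E and PV rates move together (comodulation) or in opposite directions.
    print(steady_state_rates(W, ext))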

    Emergence of synaptic organization and computation in dendrites

    Single neurons in the brain exhibit astounding computational capabilities, which gradually emerge throughout development and enable them to become integrated into complex neural circuits. These capabilities derive in part from the precise arrangement of synaptic inputs on the neurons’ dendrites. While the full computational benefits of this arrangement are still unknown, a picture emerges in which synapses organize according to their functional properties across multiple spatial scales. In particular, on the local scale (tens of microns), excitatory synaptic inputs tend to form clusters according to their functional similarity, whereas on the scale of individual dendrites or the entire tree, synaptic inputs exhibit dendritic maps where excitatory synapse function varies smoothly with location on the tree. The development of this organization is supported by inhibitory synapses, which are carefully interleaved with excitatory synapses and can flexibly modulate activity and plasticity of excitatory synapses. Here, we summarize recent experimental and theoretical research on the developmental emergence of this synaptic organization and its impact on neural computations.

    Implications of single-neuron gain scaling for information transmission in networks

    Summary: 

Many neural systems are equipped with mechanisms to efficiently encode sensory information. To represent natural stimuli with time-varying statistical properties, neural systems should adjust their gain to the inputs' statistical distribution. Such matching of dynamic range to input statistics has been shown to maximize the information transmitted by the output spike trains (Brenner et al., 2000; Fairhall et al., 2001). Gain scaling has been observed not only as a system response property, but also in single neurons of the developing somatosensory cortex stimulated with currents of different amplitude (Mease et al., 2010). While gain scaling holds for cortical neurons at the end of the first post-natal week, at birth these neurons lack this property. The observed improvement in gain scaling coincides with the disappearance of spontaneous waves of activity in cortex (Conheim et al., 2010).

We studied how single-neuron gain scaling affects the dynamics of signal transmission in networks, using the developing cortex as a model. In a one-layer feedforward network, we showed that the absence of gain control made the network relatively insensitive to uncorrelated local input fluctuations. As a result, these neurons selectively and synchronously responded to large, slowly varying correlated input: the slow build-up of synaptic noise generated in pacemaker circuits, which most likely triggers waves. Neurons in gain-scaling networks were more sensitive to the small-scale input fluctuations and responded asynchronously to the slow envelope. Thus, gain scaling both increases information in individual neurons about private inputs and allows the population average to encode the slow fluctuations in the input. Paradoxically, the synchronous firing that corresponds to wave propagation is associated with low information transfer. We therefore suggest that the emergence of gain scaling may help the system to increase information transmission on multiple timescales as sensory stimuli become important later in development.

Methods:

Networks with one and two layers consisting of hundreds of model neurons were constructed. The ability of single neurons to gain-scale was controlled by changing the ratio of sodium to potassium conductances in Hodgkin-Huxley neurons (Mainen et al., 1995). The response of single-layer networks was studied with ramp-like stimuli whose slopes varied over several hundred milliseconds. Fast fluctuations were superimposed on this slowly varying mean. The response of these networks was then tested with continuous stimuli. Gain-scaling networks captured the slow fluctuations in the inputs, while non-scaling networks simply thresholded the input. Quantifying information transmission confirmed that gain-scaling neurons transmit more information about the stimulus. With the two-layer networks, we simulated a cortical network where waves could spontaneously emerge, propagate and degrade, based on the gain-scaling properties of the neurons in the network.
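
The snippet below sketches this stimulus protocol and the two single-neuron regimes it probes: a slow ramp with fast fluctuations superimposed, a non-scaling neuron that essentially thresholds the envelope, and a gain-scaling neuron that normalises by the local fluctuation amplitude. It is an illustration of the setup under these assumptions, not the Hodgkin-Huxley code used in the study.

    import numpy as np

    rng = np.random.default_rng(0)
    dt = 0.1                                   # ms
    t = np.arange(0.0, 800.0, dt)              # the ramp unfolds over several hundred ms
    slow = np.clip(t / 400.0, 0.0, 1.0)        # slowly varying mean (the ramp)
    fast = 0.2 * rng.standard_normal(t.size)   # fast, uncorrelated fluctuations
    stimulus = slow + fast

    def non_scaling_response(s, theta=0.5):
        # Without gain control the output mostly thresholds the slow envelope.
        return (s > theta).astype(float)

    def gain_scaling_response(s, window=2000):
        # With gain scaling the output is normalised by the local fluctuation
        # amplitude, preserving sensitivity to the fast component of the input.
        kernel = np.ones(window) / window
        local_mean = np.convolve(s, kernel, mode="same")
        local_sd = np.sqrt(np.convolve((s - local_mean) ** 2, kernel, mode="same"))
        return (s - local_mean) / (local_sd + 1e-6)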

    Regulation of circuit organization and function through inhibitory synaptic plasticity

    Diverse inhibitory neurons in the mammalian brain shape circuit connectivity and dynamics through mechanisms of synaptic plasticity. Inhibitory plasticity can establish excitation/inhibition (E/I) balance, control neuronal firing, and affect local calcium concentration, hence regulating neuronal activity at the network, single neuron, and dendritic level. Computational models can synthesize multiple experimental results and provide insight into how inhibitory plasticity controls circuit dynamics and sculpts connectivity by identifying phenomenological learning rules amenable to mathematical analysis. We highlight recent studies on the role of inhibitory plasticity in modulating excitatory plasticity, forming structured networks underlying memory formation and recall, and implementing adaptive phenomena and novelty detection. We conclude with experimental and modeling progress on the role of interneuron-specific plasticity in circuit computation and context-dependent learning.
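
    As a concrete example of the phenomenological learning rules discussed above, the sketch below shows one widely used rate-based, homeostatic form of inhibitory plasticity (in the spirit of Vogels et al., 2011); the review covers a family of such rules, and the parameter values here are purely illustrative.

    ETA = 1e-3    # learning rate (assumed value)
    RHO0 = 5.0    # target firing rate of the excitatory neuron (Hz, assumed value)

    def update_inhibitory_weight(w_ie, r_inh, r_exc):
        """Potentiate inhibition when the excitatory target fires above its set-point
        and depress it when the target fires below, steering the cell toward E/I balance."""
        dw = ETA * r_inh * (r_exc - RHO0)
        return max(w_ie + dw, 0.0)   # inhibitory weights stay non-negative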

    Formation and computational implications of assemblies in neural circuits

    In the brain, patterns of neural activity represent sensory information and store it in non-random synaptic connectivity. A prominent theoretical hypothesis states that assemblies, groups of neurons that are strongly connected to each other, are the key computational units underlying perception and memory formation. Consistent with these hypothesised assemblies, experiments have revealed groups of neurons that display synchronous activity, either spontaneously or upon stimulus presentation, and exhibit behavioural relevance. While it remains unclear how assemblies form in the brain, theoretical work has contributed substantially to the understanding of the various interacting mechanisms in this process. Here, we review the recent theoretical literature on assembly formation by categorising the involved mechanisms into four components: synaptic plasticity, symmetry breaking, competition and stability. We highlight different approaches and assumptions behind assembly formation and discuss recent ideas of assemblies as the key computational unit in the brain.
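
    To make two of these components concrete, the toy simulation below combines a Hebbian update (strengthening connections between co-active neurons) with a competitive normalisation of incoming weights, which together carve a random network into stimulus-specific assemblies; the stimulus structure and parameters are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 20
    W = 0.1 * rng.random((n, n))
    np.fill_diagonal(W, 0.0)
    eta, w_budget = 0.05, 0.1 * n                     # learning rate, per-neuron weight budget

    for _ in range(200):
        group = rng.integers(2)                       # two stimuli, each driving half the neurons
        r = np.zeros(n)
        r[group * n // 2:(group + 1) * n // 2] = 1.0
        W += eta * np.outer(r, r)                     # Hebbian: co-active pairs strengthen
        np.fill_diagonal(W, 0.0)
        W *= w_budget / W.sum(axis=1, keepdims=True)  # competition: fixed incoming weight

    # After learning, within-group weights dominate between-group weights:
    # two assemblies have formed.
    print(W[:n // 2, :n // 2].mean(), W[:n // 2, n // 2:].mean())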

    The generation of cortical novelty responses through inhibitory plasticity

    Animals depend on fast and reliable detection of novel stimuli in their environment. Neurons in multiple sensory areas respond more strongly to novel than to familiar stimuli. Yet, it remains unclear which circuit, cellular, and synaptic mechanisms underlie those responses. Here, we show that spike-timing-dependent plasticity of inhibitory-to-excitatory synapses generates novelty responses in a recurrent spiking network model. Inhibitory plasticity increases the inhibition onto excitatory neurons tuned to familiar stimuli, while inhibition for novel stimuli remains low, leading to a network novelty response. The generation of novelty responses does not depend on the periodicity but rather on the distribution of presented stimuli. By including tuning of inhibitory neurons, the network further captures stimulus-specific adaptation. Finally, we suggest that disinhibition can control the amplification of novelty responses. Therefore, inhibitory plasticity provides a flexible, biologically plausible mechanism to detect the novelty of bottom-up stimuli, enabling us to make experimentally testable predictions.
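
    The toy calculation below illustrates the proposed mechanism at its simplest: repeated ("familiar") stimuli recruit inhibitory potentiation onto the excitatory cells they drive, so a never-presented ("novel") stimulus meets weaker inhibition and evokes a larger response. The rate-based plasticity rule, stimulus labels, and all parameters are simplifying assumptions, not the spiking model of the paper.

    import numpy as np

    n_stimuli = 5
    inhibition = np.zeros(n_stimuli)      # learned inhibition per stimulus tuning
    drive = 10.0                          # feedforward drive per presentation
    eta = 0.2                             # inhibitory learning rate

    familiar = [0, 1, 2, 3]               # stimuli shown during familiarisation; 4 stays novel
    for _ in range(50):                   # repeated (not necessarily periodic) presentations
        for s in familiar:
            response = max(drive - inhibition[s], 0.0)
            inhibition[s] += eta * response    # inhibition tracks excitatory firing

    responses = np.maximum(drive - inhibition, 0.0)
    print(responses)                      # the novel stimulus (index 4) responds most strongly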

    Turing's model for biological pattern formation and the robustness problem

    One of the fundamental questions in developmental biology is how the vast range of pattern and structure we observe in nature emerges from an almost uniformly homogeneous fertilized egg. In particular, the mechanisms by which biological systems maintain robustness, despite being subject to numerous sources of noise, are shrouded in mystery. Postulating plausible theoretical models of biological heterogeneity is not only difficult, but it is also further complicated by the problem of generating robustness, i.e. once we can generate a pattern, ensuring that this pattern is consistently reproducible in the face of perturbations to the domain, reaction time scale, boundary conditions and so forth. In this paper, we not only review the basic properties of Turing's theory, but also highlight the successes and pitfalls of using it as a model for biological systems and discuss emerging developments in the area.
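
    For reference, the standard two-species reaction-diffusion system behind Turing's mechanism, written in textbook notation rather than the notation of this paper, is

    \partial_t u = D_u \nabla^2 u + f(u, v),
    \partial_t v = D_v \nabla^2 v + g(u, v),

    and a homogeneous steady state (u^*, v^*) that is stable without diffusion undergoes a diffusion-driven (Turing) instability when the Jacobian entries f_u, f_v, g_u, g_v evaluated at (u^*, v^*) satisfy

    f_u + g_v < 0,
    f_u g_v - f_v g_u > 0,
    D_v f_u + D_u g_v > 0,
    (D_v f_u + D_u g_v)^2 > 4 D_u D_v (f_u g_v - f_v g_u).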

    Homeostatic Activity-Dependent Tuning of Recurrent Networks for Robust Propagation of Activity.

    Developing neuronal networks display spontaneous bursts of action potentials that are necessary for circuit organization and tuning. While spontaneous activity has been shown to instruct map formation in sensory circuits, it is unknown whether it plays a role in the organization of motor networks that produce rhythmic output. Using computational modeling, we investigate how recurrent networks of excitatory and inhibitory neuronal populations assemble to produce robust patterns of unidirectional and precisely timed propagating activity during organism locomotion. One example is provided by the motor network in Drosophila larvae, which generates propagating peristaltic waves of muscle contractions during crawling. We examine two activity-dependent models, which tune weak network connectivity based on spontaneous activity patterns: a Hebbian model, where coincident activity in neighboring populations strengthens connections between them; and a homeostatic model, where connections are homeostatically regulated to maintain a constant level of excitatory activity based on spontaneous input. The homeostatic model successfully tunes network connectivity to generate robust activity patterns with appropriate timing relationships between neighboring populations. These timing relationships can be modulated by the properties of spontaneous activity, suggesting its instructive role in generating functional variability in network output. In contrast, the Hebbian model fails to produce the tight timing relationships between neighboring populations required for unidirectional activity propagation, even when additional assumptions are imposed to constrain synaptic growth. These results argue that homeostatic mechanisms are more likely than Hebbian mechanisms to tune weak connectivity based on spontaneous input in a recurrent network for rhythm generation and robust activity propagation. SIGNIFICANCE STATEMENT: How are neural circuits organized and tuned to maintain stable function and produce robust output? This task is especially difficult during development, when circuit properties change in response to variable environments and internal states. Many developing circuits exhibit spontaneous activity, but its role in the synaptic organization of motor networks that produce rhythmic output is unknown. We studied a model motor network that, when appropriately tuned, generates propagating activity as during crawling in Drosophila larvae. Based on experimental evidence of activity-dependent tuning of connectivity, we examined plausible mechanisms by which appropriate connectivity emerges. Our results suggest that activity-dependent homeostatic mechanisms are better suited than Hebbian mechanisms for organizing motor network connectivity, and highlight an important difference from sensory areas. This work was supported by Cambridge Overseas Research Fund, Trinity College, and Swartz Foundation to J.G. and Wellcome Trust VIP funding to J.F.E. through Program Grant WT075934 to Michael Bate and Matthias Landgraf. J.G. is also supported by a Burroughs-Wellcome Fund Career Award at the Scientific Interface. This is the final version of the article. It first appeared from the Society for Neuroscience via https://doi.org/10.1523/JNEUROSCI.2511-15.201
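
    As a sketch of the two update schemes contrasted above, consider the connection weight w from one excitatory population onto its neighbour, driven by spontaneous activity levels act_pre and act_post; the functional forms and parameters below are illustrative assumptions rather than the equations used in the paper.

    ETA = 0.01       # learning rate (assumed value)
    TARGET = 1.0     # activity set-point for the homeostatic rule (assumed value)

    def hebbian_update(w, act_pre, act_post):
        # Coincident spontaneous activity in neighbouring populations strengthens the link.
        return w + ETA * act_pre * act_post

    def homeostatic_update(w, act_pre, act_post):
        # The weight grows while the postsynaptic population is below its set-point and
        # shrinks above it, holding excitatory activity near a constant level.
        return w + ETA * act_pre * (TARGET - act_post)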

    Intrinsic Neuronal Properties Switch the Mode of Information Transmission in Networks

    Diverse ion channels and their dynamics endow single neurons with complex biophysical properties. These properties determine the heterogeneity of cell types that make up the brain, as constituents of neural circuits tuned to perform highly specific computations. How do biophysical properties of single neurons impact network function? We study a set of biophysical properties that emerge in cortical neurons during the first week of development, eventually allowing these neurons to adaptively scale the gain of their response to the amplitude of the fluctuations they encounter. During the same time period, these same neurons participate in large-scale waves of spontaneously generated electrical activity. We investigate the potential role of experimentally observed changes in intrinsic neuronal properties in determining the ability of cortical networks to propagate waves of activity. We show that such changes can strongly affect the ability of multi-layered feedforward networks to represent and transmit information on multiple timescales. With properties modeled on those observed at early stages of development, neurons are relatively insensitive to rapid fluctuations and tend to fire synchronously in response to wave-like events of large amplitude. Following developmental changes in voltage-dependent conductances, these same neurons become efficient encoders of fast input fluctuations over a few layers, but lose the ability to transmit slower, population-wide input variations across many layers. Depending on the neurons' intrinsic properties, noise plays different roles in modulating neuronal input-output curves, which can dramatically impact network transmission. The developmental change in intrinsic properties supports a transformation of a network's function from the propagation of network-wide information to one in which computations are scaled to local activity. This work underscores the significance of simple changes in conductance parameters in governing how neurons represent and propagate information, and suggests a role for background synaptic noise in switching the mode of information transmission.

    Valence and State-Dependent Population Coding in Dopaminergic Neurons in the Fly Mushroom Body

    Neuromodulation permits flexibility of synapses, neural circuits, and ultimately behavior. One neuromodulator, dopamine, has been studied extensively in its role as a reward signal during learning and memory across animal species. Newer evidence suggests that dopaminergic neurons (DANs) can modulate sensory perception acutely, thereby allowing an animal to adapt its behavior and decision making to its internal and behavioral state. In addition, some data indicate that DANs are not homogeneous but rather convey different types of information as a heterogeneous population. We have investigated DAN population activity and how it could encode relevant information about sensory stimuli and state by taking advantage of the confined anatomy of DANs innervating the mushroom body (MB) of the fly Drosophila melanogaster. Using in vivo calcium imaging and a custom 3D image registration method, we found that the activity of the population of MB DANs encodes innate valence information of an odor or taste as well as the physiological state of the animal. Furthermore, DAN population activity is strongly correlated with movement, consistent with a role of dopamine in conveying behavioral state to the MB. Altogether, our data and analysis suggest that DAN population activity encodes innate odor and taste valence, movement, and physiological state in an MB-compartment-specific manner. We propose that dopamine shapes innate perception through combinatorial population coding of sensory valence, physiological, and behavioral context.